On the Complexity of Robust PCA and ℓ1-norm Low-Rank Matrix Approximation

Authors

  • Nicolas Gillis
  • Stephen A. Vavasis
Abstract

The low-rank matrix approximation problem with respect to the component-wise ℓ1-norm (ℓ1-LRA), which is closely related to robust principal component analysis (PCA), has become a very popular tool in data mining and machine learning. Robust PCA aims at recovering a low-rank matrix that was perturbed with sparse noise, with applications for example in foreground-background video separation. Although ℓ1-LRA is strongly believed to be NP-hard, there is, to the best of our knowledge, no formal proof of this fact. In this paper, we prove that ℓ1-LRA is NP-hard, already in the rank-one case, using a reduction from MAX CUT. Our derivations draw interesting connections between ℓ1-LRA and several other well-known problems, namely robust PCA, ℓ0-LRA, binary matrix factorization, a particular densest bipartite subgraph problem, the computation of the cut norm of {−1,+1} matrices, and the discrete basis problem, all of which we also prove to be NP-hard.
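For concreteness, the optimization problem whose complexity is settled can be stated as follows; this is the standard formulation of component-wise ℓ1 low-rank approximation implied by the abstract (the rank-one case on the right is the one proved NP-hard):

\[
\min_{U \in \mathbb{R}^{m \times r},\; V \in \mathbb{R}^{r \times n}} \|M - UV\|_1 = \sum_{i,j} \bigl| M_{ij} - (UV)_{ij} \bigr|,
\qquad
\text{rank-one case:}\quad \min_{u \in \mathbb{R}^m,\, v \in \mathbb{R}^n} \|M - uv^T\|_1 .
\]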

Related articles

Fast Automatic Background Extraction via Robust PCA

Recent years have seen an explosion of interest in applications of sparse signal recovery and low-rank matrix completion, due in part to the compelling use of the nuclear norm as a convex proxy for matrix rank. In some cases, minimizing the nuclear norm is equivalent to minimizing the rank of a matrix, and can lead to exact recovery of the underlying rank structure; see [Faz02, RFP10] for backg...
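The convex relaxation alluded to here is typically the principal component pursuit program of robust PCA; as a point of reference (our addition, not part of this abstract), it reads

\[
\min_{L,\, S} \; \|L\|_* + \lambda \|S\|_1 \quad \text{subject to} \quad L + S = M,
\]

where \(\|L\|_*\) is the nuclear norm (sum of singular values) of \(L\) and \(\lambda > 0\) trades off low rank against sparsity of the error term \(S\).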

CS 267 Final Project: Parallel Robust PCA

Principal Component Analysis (PCA; Pearson, 1901) is a widely used method for data compression. The goal is to find the best low-rank approximation of a given matrix, as judged by minimization of the ℓ2 norm of the difference between the original matrix and the low-rank approximation. However, the classical method is not resistant to corruption of individual input data points. Recently, a robus...
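To make the contrast concrete, here is a minimal NumPy sketch (our illustration; the function name and data are made up) of the classical ℓ2 solution via truncated SVD, which by the Eckart–Young theorem is exact for the Frobenius norm, together with a single gross outlier that skews the fit:

```python
import numpy as np

def best_rank_k(M: np.ndarray, k: int) -> np.ndarray:
    """Best rank-k approximation of M in the l2 (Frobenius) norm,
    obtained by truncating the SVD (Eckart-Young theorem)."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Illustrative data: a rank-one matrix plus a single gross outlier.
rng = np.random.default_rng(0)
M = np.outer(rng.standard_normal(50), rng.standard_normal(40))
M[0, 0] += 100.0  # one corrupted entry strongly pulls the l2 fit

L = best_rank_k(M, 1)
print(np.linalg.norm(M - L))  # residual dominated by the outlier
```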

Efficient l1-Norm-Based Low-Rank Matrix Approximations for Large-Scale Problems Using Alternating Rectified Gradient Method

Low-rank matrix approximation plays an important role in the area of computer vision and image processing. Most of the conventional low-rank matrix approximation methods are based on the ℓ2-norm (Frobenius norm), with principal component analysis (PCA) being the most popular among them. However, this can give a poor approximation for data contaminated by outliers (including missing data), becau...
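The paper above proposes an alternating rectified gradient method; as a simpler illustration of the same rank-one ℓ1 objective, here is a sketch of the classical alternating weighted-median heuristic (not the paper's algorithm; the function names are ours). It uses the fact that, for fixed v, the scalar subproblem min over u_i of Σ_j |M_ij − u_i v_j| is solved by a weighted median of M_ij / v_j with weights |v_j|, and symmetrically for v:

```python
import numpy as np

def weighted_median(values: np.ndarray, weights: np.ndarray) -> float:
    """Point x minimizing sum_j weights_j * |x - values_j|, weights >= 0."""
    order = np.argsort(values)
    v, w = values[order], weights[order]
    cum = np.cumsum(w)
    # first sorted value whose cumulative weight reaches half the total
    return v[np.searchsorted(cum, 0.5 * cum[-1])]

def rank_one_l1(M: np.ndarray, iters: int = 50):
    """Alternating weighted-median heuristic for min_{u,v} ||M - u v^T||_1.
    A local heuristic only: the problem itself is NP-hard (this paper)."""
    m, n = M.shape
    u = np.zeros(m)
    v = M[np.argmax(np.abs(M).sum(axis=1))].astype(float).copy()  # heavy row init
    for _ in range(iters):
        nzv = np.abs(v) > 1e-12
        if not nzv.any():
            break
        for i in range(m):
            u[i] = weighted_median(M[i, nzv] / v[nzv], np.abs(v[nzv]))
        nzu = np.abs(u) > 1e-12
        if not nzu.any():
            break
        for j in range(n):
            v[j] = weighted_median(M[nzu, j] / u[nzu], np.abs(u[nzu]))
    return u, v
```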

L1-norm Principal-Component Analysis in L2-norm-reduced-rank Data Subspaces

Standard Principal-Component Analysis (PCA) is known to be very sensitive to outliers among the processed data. On the other hand, it has been recently shown that L1-norm-based PCA (L1-PCA) exhibits sturdy resistance against outliers, while it performs similarly to standard PCA when applied to nominal or smoothly corrupted data. Exact calculation of the K L1-norm Principal Components (L1-PCs) of ...
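The L1-PCA referred to here is usually the projection-maximization formulation; stated for reference (our reading of the literature, not quoted from this snippet), the K L1-PCs of a data matrix \(X \in \mathbb{R}^{D \times N}\) solve

\[
\max_{Q \in \mathbb{R}^{D \times K},\; Q^T Q = I_K} \|X^T Q\|_1 ,
\]

i.e., they maximize the ℓ1 norm of the projected data rather than minimizing an ℓ2 reconstruction error.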

Generalised Scalable Robust Principal Component Analysis

The robust estimation of the low-dimensional subspace that spans a set of high-dimensional observations, possibly corrupted by gross errors and outliers, is fundamental in many computer vision problems. The state-of-the-art robust principal component analysis (PCA) methods adopt convex relaxations of ℓ0 quasi-norm-regularised rank minimisation problems. That is, the nuclear norm an...
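The ℓ0 quasi-norm-regularised rank minimisation mentioned here typically takes the form (again our addition for context):

\[
\min_{L,\, S} \; \operatorname{rank}(L) + \lambda \|S\|_0 \quad \text{subject to} \quad M = L + S,
\]

whose convex relaxation replaces \(\operatorname{rank}(L)\) by the nuclear norm \(\|L\|_*\) and \(\|S\|_0\) by \(\|S\|_1\), recovering the principal component pursuit program given above.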

Journal:
  • CoRR

Volume abs/1509.09236  Issue

Pages  -

Publication date 2015